Learnability and the doubling dimension
Authors
Abstract
Given a set of classifiers and a probability distribution over their domain, one can define a metric by taking the distance between a pair of classifiers to be the probability that they classify a random item differently. We prove bounds on the sample complexity of PAC learning in terms of the doubling dimension of this metric. These bounds imply known bounds on the sample complexity of learning halfspaces with respect to the uniform distribution that are optimal up to a constant factor. We prove a bound that holds for any algorithm that outputs a classifier with zero error whenever this is possible; this bound is in terms of the maximum of the doubling dimension and the VC-dimension of the class, and it strengthens the best known bound stated in terms of the VC-dimension alone. We show that there is no bound on the doubling dimension in terms of the VC-dimension of the class, in contrast with the metric dimension. (A paper about this work was published in the proceedings of NIPS'06.)
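As a concrete reading of the metric defined above, the sketch below (our own illustration, not code from the paper; the two halfspaces and the uniform distribution on the square are assumptions chosen for the example) estimates the distance between two classifiers by Monte Carlo sampling. Under this metric, the doubling dimension is the smallest d such that every ball can be covered by at most 2^d balls of half its radius.

```python
import random

def disagreement_distance(f, g, sample, n=100_000):
    """Monte Carlo estimate of d(f, g) = Pr_{x ~ P}[f(x) != g(x)],
    where `sample` draws one point from the distribution P."""
    disagreements = 0
    for _ in range(n):
        x = sample()
        if f(x) != g(x):
            disagreements += 1
    return disagreements / n

# Example (our own choice): two halfspaces through the origin,
# with P uniform on the square [-1, 1]^2.
rng = random.Random(0)
sample = lambda: (rng.uniform(-1, 1), rng.uniform(-1, 1))
f = lambda p: p[0] >= 0           # halfspace with normal (1, 0)
g = lambda p: p[0] + p[1] >= 0    # halfspace with normal (1, 1)
print(disagreement_distance(f, g, sample))  # ~0.25: they disagree on two wedges
```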
Similar references
Agnostic Online Learnability
We study a fundamental question: what classes of hypotheses are learnable in the online learning model? The analogous question in the PAC learning model [12] was addressed by Vapnik and others [13], who showed that the VC dimension characterizes the learnability of a hypothesis class. In his influential work, Littlestone [9] studied the online learnability of hypothesis classes, but only in the...
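For readers new to the mistake-bound model Littlestone studied, here is a minimal sketch (our own toy illustration, not from the cited work) of the classic Halving algorithm: predict by majority vote over the hypotheses still consistent with the history, so each mistake discards at least half of them, giving at most log2 |H| mistakes whenever some hypothesis in the finite class H labels the whole sequence correctly.

```python
def halving(hypotheses, stream):
    """Online mistake-bound learning by majority vote over the version space.
    Makes at most log2(len(hypotheses)) mistakes if some hypothesis in the
    class labels the entire stream correctly."""
    version_space = list(hypotheses)
    mistakes = 0
    for x, y in stream:                                    # (instance, label)
        votes_true = sum(1 for h in version_space if h(x))
        prediction = 2 * votes_true >= len(version_space)  # majority vote
        if prediction != y:
            mistakes += 1
        # keep only hypotheses consistent with the revealed label
        version_space = [h for h in version_space if h(x) == y]
    return mistakes

# Toy run (assumed example): thresholds on {0, ..., 9}, hidden target 6.
H = [lambda x, t=t: x >= t for t in range(10)]
stream = [(x, x >= 6) for x in [3, 8, 5, 6, 7, 2]]
print(halving(H, stream))  # at most log2(10) ~ 3.3, so no more than 3 mistakes
```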
PAC Learnability of a Concept Class under Non-atomic Measures: A Problem by Vidyasagar
In response to a 1997 problem of M. Vidyasagar, we state a necessary and sufficient condition for distribution-free PAC learnability of a concept class C under the family of all non-atomic (diffuse) measures on the domain Ω. Clearly, finiteness of the classical Vapnik–Chervonenkis dimension of C is a sufficient, but no longer necessary, condition. Besides, learnability of C under non-atomic mea...
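To make the Vapnik–Chervonenkis dimension referenced above concrete: it is the size of the largest point set that the class can label in all 2^k possible ways. The brute-force check below is our own illustration, feasible only for tiny finite classes.

```python
from itertools import combinations

def is_shattered(points, hypotheses):
    """True iff the class realizes all 2^k labelings of the k given points."""
    labelings = {tuple(bool(h(x)) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def vc_dimension(domain, hypotheses):
    """Largest k such that some k-subset of `domain` is shattered."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(is_shattered(s, hypotheses) for s in combinations(domain, k)):
            d = k
        else:
            break  # if no k-set is shattered, no larger set can be
    return d

# Thresholds on a line shatter any single point but never a pair: VC dim 1.
domain = list(range(6))
H = [lambda x, t=t: x >= t for t in range(7)]
print(vc_dimension(domain, H))  # 1
```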
VC Dimension and Learnability of Sparse Polynomials and Rational Functions
We prove upper and lower bounds on the VC dimension of sparse univariate polynomials over the reals, and apply these results to prove uniform learnability of sparse polynomials and rational functions. As another application we solve an open problem of Vapnik [Vapnik 82] on uniform approximation of the general regression functions, a central problem of computational statistics (cf. [Vapnik 82]), ...
Online Learning: Random Averages, Combinatorial Parameters, and Learnability
We study learnability in the online learning model. We define several complexity measures which capture the difficulty of learning in a sequential manner. Among these measures are analogues of Rademacher complexity, covering numbers, and the fat-shattering dimension from statistical learning theory. Relationships among these complexity measures, their connection to online learning, and tools for boun...
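The classical (i.i.d.) Rademacher complexity, whose sequential analogue this paper develops, can itself be estimated directly. A small hedged sketch follows, with a toy threshold class chosen purely as an assumed example.

```python
import random

def empirical_rademacher(hypotheses, sample, trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_S(H) = E_sigma[ sup_h (1/n) sum_i sigma_i * h(x_i) ]
    for hypotheses h taking values in {-1, +1}."""
    rng = random.Random(seed)
    n = len(sample)
    total = 0.0
    for _ in range(trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]   # random signs
        total += max(
            sum(s * h(x) for s, x in zip(sigma, sample)) / n
            for h in hypotheses
        )
    return total / trials

# Toy class: {-1, +1}-valued thresholds evaluated on ten sample points.
points = [0.1 * i for i in range(10)]
H = [lambda x, t=t: 1 if x >= t else -1 for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
print(empirical_rademacher(H, points))
```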
A generalization of the PAC learning in product probability spaces
Three notions, dependent theories, VC-dimension, and PAC-learnability, have been found to be closely related. In addition to the known relations among these notions in model theory, finite combinatorics, and probability theory, Chernikov, Palacin, and Takeuchi found a relation between n-dependence and VCn-dimension, which are generalizations of dependence and VC-dimension respectively. We are now wor...